Start Year
2012

We're driven by the goal of creating computational systems that continuously respond to emotional expression. There's something special about people's bodies: how people interact through the sense of touch, how their skin responds, how their hearts behave. We build machine learning models from biometric and touch-sensor data collected during interactive, real-world contexts. Current projects include (1) using EEG systems to detect continuous changes in affect during video game play; (2) detecting emotional state during talk therapy; and (3) using a touch robot as an intervention for stress-related behaviours.

Publications